A conjugate directions approach to improve the limited-memory BFGS method

Authors

  • Jan Vlcek
  • Ladislav Luksan
Abstract

Simple modifications of the limited-memory BFGS method (L-BFGS) for large-scale unconstrained optimization are considered. They consist of corrections (derived from the idea of conjugate directions) to the stored difference vectors, using information from the preceding iteration. For quadratic objective functions the convergence improvement is, in a certain sense, the best possible, and all stored difference vectors are conjugate for unit step sizes. Global convergence of the algorithm is established for convex, sufficiently smooth functions. Numerical experiments indicate that the new method often improves significantly on L-BFGS.
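For context, the method acts on the difference pairs fed into the standard L-BFGS two-loop recursion. The NumPy sketch below implements only that standard recursion; the comment marking where corrected pairs would enter is illustrative, and the paper's correction formula itself is not reproduced here.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns -H_k @ grad, where H_k
    is the inverse-Hessian approximation implicitly defined by the stored
    difference pairs (s_i, y_i), ordered oldest first.

    The method discussed above would first replace the stored pairs by
    corrected ("conjugated") difference vectors; here the pairs are used
    exactly as given.
    """
    if not s_list:
        return -grad
    q = grad.astype(float).copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    # Initial scaling H_0 = gamma * I (the usual Shanno-Phua choice).
    s, y = s_list[-1], y_list[-1]
    r = (np.dot(s, y) / np.dot(y, y)) * q
    # Second loop: oldest pair to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return -r
```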


Related articles

A limited memory adaptive trust-region approach for large-scale unconstrained optimization

This study is concerned with a trust-region method for solving unconstrained optimization problems. The approach combines the compact limited-memory BFGS updating formula with an appropriate adaptive radius strategy. The adaptive technique reduces the number of subproblems to be solved, while exploiting the structure of the limited-memory quasi-Newton…
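As background for the adaptive radius strategy, the sketch below shows the conventional ratio-based radius update that such adaptive rules refine or replace; the thresholds are standard textbook values, not those of the cited paper.

```python
def update_radius(delta, rho, step_norm, delta_max=1e3):
    """Conventional trust-region radius update driven by the ratio
    rho = (actual reduction) / (predicted reduction).
    The cited paper replaces this fixed rule with an adaptive one."""
    if rho < 0.25:
        return 0.25 * delta                  # poor model agreement: shrink
    if rho > 0.75 and abs(step_norm - delta) < 1e-12:
        return min(2.0 * delta, delta_max)   # good step on the boundary: expand
    return delta                             # otherwise keep the radius
```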

The Limited Memory Conjugate Gradient Method

In theory, the successive gradients generated by the conjugate gradient method applied to a quadratic should be orthogonal. However, for some ill-conditioned problems, orthogonality is quickly lost due to rounding errors, and convergence is much slower than expected. A limited-memory version of the nonlinear conjugate gradient method is developed. The memory is used both to detect the loss of orthogonality…
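The loss of orthogonality referred to above can be monitored with the classical Powell-style restart test, sketched below; the threshold 0.2 is Powell's conventional value, and the cited method's actual detection rule may differ.

```python
import numpy as np

def orthogonality_lost(g_new, g_old, threshold=0.2):
    """Powell-style restart test for nonlinear CG: in exact arithmetic on a
    quadratic, successive gradients are orthogonal, so a large value of
    |g_new . g_old| / ||g_new||^2 signals lost orthogonality (and, in a
    limited-memory variant, a trigger to exploit the stored information)."""
    return abs(np.dot(g_new, g_old)) >= threshold * np.dot(g_new, g_new)
```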

Limited Memory BFGS Updating in a Trust-Region Framework

The limited-memory BFGS method pioneered by Jorge Nocedal is usually implemented as a line-search method in which the search direction is computed from a BFGS approximation to the inverse of the Hessian. The advantage of inverse updating is that search directions are obtained by a matrix-vector multiplication. In this paper it is observed that limited-memory updates to the Hessian approximation…

Solving Limited-Memory BFGS Systems with Generalized Diagonal Updates

In this paper, we investigate a formula for solving systems of the form (B_k + D)x = y, where B_k comes from a limited-memory BFGS quasi-Newton method and D is a diagonal matrix with diagonal entries d_ii ≥ σ for some σ > 0. Systems of this type arise naturally in large-scale optimization. We show that, provided a simple condition holds on B_0 and σ, the system (B_k + D)x = y can be solved via a recursion…
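Limited-memory BFGS matrices admit a compact representation of the form B_k = B_0 + Ψ M Ψᵀ with a low-rank middle term, so when B_0 and D are both diagonal the shifted system can be attacked with the Sherman-Morrison-Woodbury identity. The sketch below illustrates that generic identity only; the paper's own recursion differs in detail and is not reproduced.

```python
import numpy as np

def solve_shifted(b0_diag, d_diag, Psi, M, y):
    """Solve (B + D) x = y for B = diag(b0_diag) + Psi @ M @ Psi.T and
    D = diag(d_diag), via the Sherman-Morrison-Woodbury identity with
    C = diag(b0_diag + d_diag):
        (C + Psi M Psi^T)^{-1}
          = C^{-1} - C^{-1} Psi (M^{-1} + Psi^T C^{-1} Psi)^{-1} Psi^T C^{-1}.
    Only a small dense system (size = number of stored pairs times two)
    is factored, so the cost is linear in the dimension n."""
    c_inv = 1.0 / (b0_diag + d_diag)        # C is diagonal: invert elementwise
    u = c_inv * y                           # C^{-1} y
    W = Psi.T @ (c_inv[:, None] * Psi)      # Psi^T C^{-1} Psi (small matrix)
    small = np.linalg.inv(M) + W            # M^{-1} + Psi^T C^{-1} Psi
    z = np.linalg.solve(small, Psi.T @ u)
    return u - c_inv * (Psi @ z)
```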

Enriched Methods for Large-Scale Unconstrained Optimization

This paper describes a class of optimization methods that interlace iterations of the limited-memory BFGS method (L-BFGS) and a Hessian-free Newton method (HFN) in such a way that the information collected by one type of iteration improves the performance of the other. Curvature information about the objective function is stored in the form of a limited-memory matrix and plays the dual role of preconditioning…
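One half of that dual role is using the limited-memory matrix as a preconditioner for the inner conjugate gradient solves of the HFN iterations. Below is a minimal, generic sketch of such a preconditioned inner solve; the callables `hessvec` and `precond` are illustrative placeholders, not the paper's interface.

```python
import numpy as np

def newton_cg_step(grad, hessvec, precond, tol=1e-8, max_iter=100):
    """Inner solve of a Hessian-free Newton step, H p = -g, by preconditioned
    conjugate gradients.  `hessvec(v)` returns H @ v (e.g. via finite
    differences of the gradient); `precond(r)` applies an approximate inverse
    Hessian, such as a limited-memory matrix built from earlier iterations.
    Negative-curvature safeguards of a full Newton-CG are reduced to an
    early exit here."""
    p = np.zeros_like(grad)
    r = -grad.astype(float).copy()          # residual of H p = -g at p = 0
    z = precond(r)
    d = z.copy()
    rz = np.dot(r, z)
    for _ in range(max_iter):
        Hd = hessvec(d)
        dHd = np.dot(d, Hd)
        if dHd <= 0:
            break                           # negative curvature encountered
        alpha = rz / dHd
        p += alpha * d
        r -= alpha * Hd
        if np.linalg.norm(r) <= tol * np.linalg.norm(grad):
            break
        z = precond(r)
        rz_new = np.dot(r, z)
        d = z + (rz_new / rz) * d
        rz = rz_new
    return p
```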


Journal:
  • Applied Mathematics and Computation

Volume 219, Issue -

Pages -

Publication date: 2012